What does it mean when an AI is “hallucinating”?
We say an AI is “hallucinating” (also known as “confabulating”¹) when it confidently presents made-up information as fact. For example, when asked for the world record for crossing the English Channel entirely on foot, one chatbot answered with a specific name, time, and date:
[Screenshot of the chatbot’s answer.] (Source.) The name is correct, but the record time and date are wrong, and Wandratsch’s crossing was done by swimming, rather than walking (or any combination of the two).
As of 2025, even the most powerful LLMs (AI models that take in some text and predict how the text is most likely to continue) still sometimes hallucinate.
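This prediction framing helps explain why hallucinations happen: the model is trained to continue text plausibly, not to verify facts. Here is a minimal sketch of the idea as a toy bigram word model trained on a made-up corpus (everything in it is illustrative; real LLMs are vastly larger neural networks operating on subword tokens, but the interface is the same: text in, a statistically likely continuation out).

```python
# Toy next-word predictor: a bigram model over a made-up corpus.
from collections import Counter, defaultdict

corpus = "the record was set in 2005 . the record was broken in 2007 .".split()

# Count which word follows each word in the corpus.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict_next(word: str) -> str:
    """Return the most frequent continuation seen in training."""
    return following[word].most_common(1)[0][0]

print(predict_next("record"))  # -> "was"
# The model only knows which continuation is *plausible*; it has no notion
# of whether the completed sentence is *true*. Scaled up, that gap is
# where hallucinations come from.
```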
One technique used to mitigate hallucinations is RAG (retrieval-augmented generation): before answering, the system retrieves relevant passages from a trusted document store and inserts them into the model’s prompt, so the answer can be grounded in that text rather than in whatever the model memorized (or misremembered) during training.
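The sketch below illustrates the retrieval-and-prompting step under stated assumptions: a hypothetical in-memory document list, a deliberately crude word-overlap retriever, and a prompt that would be sent to whatever LLM is available. None of the names or documents here come from any particular library; they are placeholders for illustration.

```python
# Sketch of retrieval-augmented generation (RAG). The documents, scoring
# function, and prompt format are illustrative assumptions, not a real API.
from collections import Counter

DOCUMENTS = [
    "Christof Wandratsch is a German marathon swimmer.",
    "Wandratsch has held a record for swimming across the English Channel.",
    "There is no record for crossing the English Channel entirely on foot.",
]

def score(query: str, doc: str) -> int:
    """Crude relevance score: number of words shared with the query."""
    return sum((Counter(query.lower().split()) & Counter(doc.lower().split())).values())

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k documents that best match the query."""
    return sorted(DOCUMENTS, key=lambda doc: score(query, doc), reverse=True)[:k]

def build_prompt(query: str) -> str:
    """Ground the answer in retrieved text instead of the model's memory."""
    context = "\n".join(f"- {doc}" for doc in retrieve(query))
    return (
        "Answer using only the context below. If the context does not "
        "contain the answer, say you don't know.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {query}\n"
    )

# The resulting prompt would be sent to an LLM; printed here for inspection.
print(build_prompt("What is the record for crossing the English Channel on foot?"))
```

The instruction to answer “using only the context” is the key design choice: it gives the model permission to decline rather than improvise when the retrieved text doesn’t contain the answer.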
While the phenomenon has usually been called “hallucination”, “confabulation” is arguably the more accurate term, as it just means “making up a story” and doesn’t imply that the AI is reporting false sensory experiences. ↩︎